# 2.7 billion parameters

## GPT-Neo 2.7B

License: MIT

GPT-Neo 2.7B is a 2.7-billion-parameter Transformer language model built by EleutherAI as a replication of the GPT-3 architecture and trained on the Pile dataset.

Tags: Large Language Model, English
Publisher: EleutherAI
52.68k · 486
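
For reference, below is a minimal text-generation sketch using the Hugging Face `transformers` library, assuming the checkpoint is published on the Hub as `EleutherAI/gpt-neo-2.7B`. The prompt text and sampling settings here are illustrative choices, not values prescribed by the model card; the full-precision checkpoint is large, so the first download and load will take a while and need roughly 10 GB of memory.

```python
# Minimal sketch: generate text with GPT-Neo 2.7B via the transformers pipeline API.
# Assumes the "EleutherAI/gpt-neo-2.7B" checkpoint is available on the Hugging Face Hub.
from transformers import pipeline

generator = pipeline("text-generation", model="EleutherAI/gpt-neo-2.7B")

output = generator(
    "GPT-Neo 2.7B is a transformer model that",  # illustrative prompt
    max_new_tokens=50,  # cap the length of the generated continuation
    do_sample=True,     # sample tokens instead of greedy decoding
    temperature=0.9,    # illustrative sampling temperature
)
print(output[0]["generated_text"])
```

Sampling (`do_sample=True`) generally suits open-ended generation with a model like this; for more deterministic output, greedy decoding or beam search can be used instead.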